On the relationship between densities of Shannon entropy and Fisher information for atoms and molecules.

Author

  • Shubin Liu
Abstract

An analytical relationship between the densities of the Shannon entropy and the Fisher information for atomic and molecular systems is established in this work. Two equivalent forms of the Fisher information density are also introduced. It is found that, for the electron densities of atoms and molecules, the Shannon entropy density is intrinsically related to the electron density and to the two forms of the Fisher information density. The formulas are confirmed by numerical results for atoms of the first two rows of the periodic table.
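The abstract refers to the Shannon entropy density and to two equivalent forms of the Fisher information density. For orientation, the standard information-theoretic definitions over an electron density ρ(r) are sketched below; these are the usual conventions in the density-functional literature, not necessarily the exact working forms of the paper:

```latex
% Shannon entropy and its density s(\mathbf{r}):
S = -\int \rho(\mathbf{r}) \ln \rho(\mathbf{r})\, d\mathbf{r},
\qquad
s(\mathbf{r}) = -\rho(\mathbf{r}) \ln \rho(\mathbf{r}).

% Two equivalent forms of the Fisher information; they differ by a
% surface term \int \nabla^{2}\rho \, d\mathbf{r}, which vanishes for
% densities decaying at infinity:
I = \int \frac{\lvert \nabla \rho(\mathbf{r}) \rvert^{2}}{\rho(\mathbf{r})}\, d\mathbf{r}
  = -\int \rho(\mathbf{r})\, \nabla^{2} \ln \rho(\mathbf{r})\, d\mathbf{r}.
```

The equality of the two integrals follows from ∇²ln ρ = ∇²ρ/ρ − |∇ρ|²/ρ² and integration of ∇²ρ over all space.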


Similar articles

A New Model for Best Customer Segment Selection Using Fuzzy TOPSIS Based on Shannon Entropy

In today’s competitive market, a business firm must evaluate and rank its potential customer segments to improve its Customer Relationship Management (CRM) and win higher profit than its rivals. This makes efficient decision-making methods important in the current fast-growing information era. These decisions usually involve several criteria, ...


Tsallis Entropy and Conditional Tsallis Entropy of Fuzzy Partitions

The purpose of this study is to define the concepts of Tsallis entropy and conditional Tsallis entropy of fuzzy partitions and to obtain some results concerning this kind of entropy. We show that the Tsallis entropy of fuzzy partitions has the subadditivity and concavity properties. We study this information measure under the refinement and zero-mode subset relations. We check the chain rules for ...
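The fuzzy-partition version discussed in this abstract generalizes the scalar Tsallis entropy of an ordinary probability distribution. A minimal sketch of that underlying quantity, S_q = (1 − Σᵢ pᵢ^q)/(q − 1), which recovers the Shannon entropy in the limit q → 1 (function name and example values are illustrative, not taken from the paper):

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i**q) / (q - 1).

    Recovers the Shannon entropy (natural log) in the limit q -> 1.
    """
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                              # 0 * log(0) := 0 convention
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))         # Shannon limit
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

probs = [0.5, 0.25, 0.25]
print(tsallis_entropy(probs, 2.0))   # 1 - (0.25 + 0.0625 + 0.0625) = 0.625
print(tsallis_entropy(probs, 1.0))   # Shannon entropy: 1.5 * ln 2
```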


Jensen divergence based on Fisher's information

The measure of Jensen-Fisher divergence between probability distributions is introduced and its theoretical grounds are set up. In contrast to the other Jensen divergences, this quantity is very sensitive to fluctuations of the probability distributions because it is controlled by the (local) Fisher information, which is a gradient functional of the distribution. It is therefore appropriate and ...


Cramér-Rao and moment-entropy inequalities for Renyi entropy and generalized Fisher information

The moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian. Stam’s inequality shows that a continuous random variable with given Fisher information and minimal Shannon entropy must also be Gaussian. The Cramér-Rao inequality is a direct consequence of these two inequalities. In this paper the inequalities above are ...
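The extremal role of the Gaussian described above can be checked numerically through Stam's inequality N(X)·J(X) ≥ 1 in one dimension, where N(X) = exp(2h(X))/(2πe) is the entropy power and J(X) the Fisher information; equality holds exactly for a Gaussian. A minimal quadrature sketch (grid size and σ are arbitrary choices):

```python
import numpy as np

# Numerical check of Stam's inequality N(X) * J(X) >= 1 for a 1-D Gaussian.
# Equality holds exactly for the Gaussian; small deviations here are
# quadrature error from the finite grid.
sigma = 1.7
x = np.linspace(-12.0, 12.0, 200001)
dx = x[1] - x[0]
p = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))

h = -np.sum(p * np.log(p)) * dx          # differential (Shannon) entropy
N = np.exp(2 * h) / (2 * np.pi * np.e)   # entropy power; = sigma**2 here
dp = np.gradient(p, dx)                  # central-difference derivative
J = np.sum(dp**2 / p) * dx               # Fisher information; = 1/sigma**2

print(N * J)   # ~1 for a Gaussian
```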


Dynamics of Uncertainty in Nonequilibrium Random Motion

Shannon information entropy is a natural measure of probability (de)localization and thus (un)predictability in various procedures of data analysis for model systems. We pay particular attention to links between the Shannon entropy and the related Fisher information notion, which jointly account for the shape and extension of continuous probability distributions. Classical, dynamical and random...



Journal:
  • The Journal of chemical physics

Volume 126, Issue 19

Pages: -

Publication date: 2007